Definition of West Indies

  • 1. (noun) The string of islands between North America and South America; a popular resort area

Synonyms for the word "West Indies"

Words semantically linked with "West Indies"